Recovering Joint Probability of Discrete Random Variables From Pairwise Marginals
Authors
Abstract
Learning the joint probability of random variables (RVs) is a cornerstone of statistical signal processing and machine learning. However, direct nonparametric estimation of a high-dimensional joint distribution is in general impossible, due to the curse of dimensionality. Recent work has proposed to recover the joint probability mass function (PMF) of an arbitrary number of RVs from three-dimensional marginals, leveraging the algebraic properties of low-rank tensor decomposition and the (unknown) dependence among the RVs. Nonetheless, accurately estimating three-dimensional marginals can still be costly in terms of sample complexity, which affects the performance of this line of work in practice in the sample-starved regime. Using three-dimensional marginals also involves challenging problems whose tractability is unclear. This work puts forth a new framework for learning the joint PMF using only pairwise marginals, which naturally enjoy a lower sample complexity relative to third-order ones. A coupled nonnegative matrix factorization (CNMF) framework is developed, and its joint PMF recovery guarantees under various conditions are analyzed. Our method features a Gram–Schmidt (GS)-like algorithm that exhibits competitive runtime performance. The algorithm is shown to provably recover the joint PMF up to a bounded error in finite iterations, under reasonable conditions. It is also shown that a recently proposed economical expectation maximization (EM) algorithm can improve upon the GS-like algorithm's output, thereby further lifting the accuracy and efficiency. Real-data experiments are employed to showcase the effectiveness of the proposed approach.
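To make the pairwise-marginal idea concrete, below is a minimal, self-contained sketch, not the authors' CNMF/GS/EM implementation, assuming a naive-Bayes-style low-rank PMF model: a hidden variable with prior lam selects columns of per-variable conditional PMF matrices A_n, so every pairwise marginal factors as A_j diag(lam) A_l^T, and a heuristic alternating projected-gradient loop fits all pairs jointly. All names (N, I, F, A, lam, X) and the step size are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)
N, I, F = 5, 8, 3                                # number of RVs, alphabet size, latent rank (assumed)
lam = rng.dirichlet(np.ones(F))                  # prior of the hidden variable
A = [rng.dirichlet(np.ones(I), size=F).T for _ in range(N)]   # I x F conditional PMFs (columns sum to 1)

# Pairwise marginals implied by the low-rank model: X[j, l] = A_j diag(lam) A_l^T
X = {(j, l): A[j] @ (lam[:, None] * A[l].T) for j in range(N) for l in range(N) if j < l}

def simplex_cols(M):
    # crude feasibility step: clip negatives and renormalize each column to sum to 1
    M = np.clip(M, 1e-12, None)
    return M / M.sum(axis=0, keepdims=True)

# Coupled NMF by alternating projected gradient over all pairs
# (a heuristic stand-in, not the paper's GS-like or EM algorithms)
Ah = [simplex_cols(rng.random((I, F))) for _ in range(N)]
lamh = np.full(F, 1.0 / F)
step = 0.5
for _ in range(3000):
    for n in range(N):
        g = np.zeros((I, F))
        for (j, l), Xjl in X.items():
            R = Ah[j] @ (lamh[:, None] * Ah[l].T) - Xjl      # residual of this pair
            if j == n:
                g += R @ Ah[l] * lamh                        # gradient of 0.5*||R||_F^2 w.r.t. A_j
            elif l == n:
                g += R.T @ Ah[j] * lamh                      # gradient w.r.t. A_l
        Ah[n] = simplex_cols(Ah[n] - step * g)
    glam = np.zeros(F)
    for (j, l), Xjl in X.items():
        R = Ah[j] @ (lamh[:, None] * Ah[l].T) - Xjl
        glam += np.einsum('ik,ij,jk->k', Ah[j], R, Ah[l])    # gradient w.r.t. lam
    lamh = np.clip(lamh - step * glam, 1e-12, None)
    lamh /= lamh.sum()

fit = np.mean([np.linalg.norm(Ah[j] @ (lamh[:, None] * Ah[l].T) - Xjl) / np.linalg.norm(Xjl)
               for (j, l), Xjl in X.items()])
print("average relative fit of the pairwise marginals:", fit)

In the paper itself, recovery is handled by the Gram–Schmidt-like procedure with provable guarantees and refined by an economical EM step; the gradient loop above only illustrates what a coupled factorization of the pairwise marginals means.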
Similar resources
Probability Models.S2 Discrete Random Variables
For a particular decision situation, the analyst must assign a distribution to each random variable. One method is to perform repeated replications of the experiment. Statistical analysis provides estimates of the probability of each possible occurrence. Another, and often more practical method, is to identify the distribution to be one of the named distributions. It is much easier to estimate ...
Discrete Random Variables and Probability Distributions
Suppose a city’s traffic engineering department monitors a certain intersection during a one-hour period in the middle of the day. Many characteristics might be of interest to the observers, including the number of vehicles that enter the intersection, the largest number of vehicles in the left turn lane during a signal cycle, the speed of the fastest vehicle going through the intersection, the...
Pairwise Independent Random Variables
In this lecture we discuss how to derandomize algorithms. We will see a brute force algorithm (enumeration) for derandomization. We will also see that some randomized algorithms do not need true randomness. Specifically, we will see an example where only pairwise random bits are needed. Next, we will see how we can generate pairwise random values and how this conservation on the amount of randomnes...
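As a side note to the excerpt above, here is a small illustrative sketch (my own, not taken from the cited lecture notes) of the standard construction it alludes to: from t truly random seed bits, XOR-ing every non-empty subset yields 2^t - 1 bits that are each uniform and pairwise independent.

import itertools
import random

def pairwise_independent_bits(seed_bits):
    # One output bit per non-empty subset of the seed: the XOR of that subset.
    t = len(seed_bits)
    out = []
    for r in range(1, t + 1):
        for subset in itertools.combinations(range(t), r):
            bit = 0
            for i in subset:
                bit ^= seed_bits[i]
            out.append(bit)
    return out

seed = [random.randint(0, 1) for _ in range(4)]   # 4 truly random bits
bits = pairwise_independent_bits(seed)            # 2**4 - 1 = 15 pairwise-independent, uniform bits
print(bits)

Because the symmetric difference of two distinct non-empty subsets is itself non-empty, any two of these output bits are jointly uniform, hence pairwise independent.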
Coverage probability of prediction intervals for discrete random variables
Prediction intervals are a widely used tool in industrial applications to predict the distribution of future observations. The exact minimum coverage probability and the average coverage probability of the conventional prediction interval for a discrete random variable have not been accurately derived in the literature. In this paper, procedures to compute the exact minimum confidence levels and th...
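To illustrate the kind of computation the excerpt above refers to, the sketch below (my own illustration under assumed choices, not the paper's procedure) evaluates the exact coverage probability of a conventional Wald-type prediction interval for a future binomial count by enumerating the discrete sample space, then reports its average and minimum over a grid of p. The sample sizes, nominal level, and interval form are all assumptions.

import numpy as np
from scipy.stats import binom, norm

n, m, conf = 30, 30, 0.95        # past sample size, future sample size, nominal level (assumed)
z = norm.ppf(0.5 + conf / 2)

def coverage(p):
    # Exact coverage at a given p: average over all possible observed counts x of the
    # probability that a future Y ~ Binomial(m, p) falls in the interval built from x.
    x = np.arange(n + 1)
    phat = x / n
    half = z * np.sqrt(m * phat * (1 - phat) * (1 + m / n))   # Wald-type half-width
    lo = np.ceil(m * phat - half)
    hi = np.floor(m * phat + half)
    cover_y = binom.cdf(hi, m, p) - binom.cdf(lo - 1, m, p)
    return np.sum(binom.pmf(x, n, p) * cover_y)

grid = np.linspace(0.01, 0.99, 99)
cov = np.array([coverage(p) for p in grid])
print("average coverage:", cov.mean(), "minimum coverage:", cov.min())

The minimum over the grid typically falls below the nominal level for discrete data, which is the sort of gap that exact coverage analyses of prediction intervals aim to quantify.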
Journal
Journal title: IEEE Transactions on Signal Processing
Year: 2021
ISSN: 1053-587X, 1941-0476
DOI: https://doi.org/10.1109/tsp.2021.3090960